Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.
Abstract

BACKGROUND: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task, and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

METHODS: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.

RESULTS: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

CONCLUSIONS: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.

TRIAL REGISTRATION: NCT01364480 and NCT01894802.
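The abstract describes blending BMI-derived movement commands with autonomous grasping commands as the hand approaches an object. A minimal sketch of one common arbitration scheme for this kind of shared control (linear blending of user and autonomous velocity commands, with the assistance weight ramping up near the target); the function names, the proximity-based ramp, and all parameter values here are illustrative assumptions, not the paper's actual arbitration rule:

```python
import numpy as np

def blend_commands(user_vel, auto_vel, dist_to_object, assist_radius=0.2):
    """Blend a BMI-derived velocity command with an autonomous grasping
    command. Far from the object the user is in full control; as the
    hand nears the object, the autonomous command's weight increases.
    The linear ramp and assist_radius are hypothetical choices."""
    # alpha = 0 outside assist_radius (pure user control),
    # alpha -> 1 as distance approaches zero (autonomy dominates).
    alpha = max(0.0, 1.0 - dist_to_object / assist_radius)
    return (1.0 - alpha) * np.asarray(user_vel) + alpha * np.asarray(auto_vel)

# Far from the object (0.5 m > assist_radius): command is entirely user-driven.
far = blend_commands([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], dist_to_object=0.5)

# Close to the object (0.05 m): the autonomous grasp command dominates.
near = blend_commands([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], dist_to_object=0.05)
```

A continuous blend like this avoids abrupt handoffs between user and machine, which is consistent with the abstract's goal of balancing BMI-derived intention against computer assistance.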
